Images for the practical work can be found at: https://perso.telecom-paristech.fr/tupin/TPSAR/DATA/images/
Code for the practical work is available here: https://perso.telecom-paristech.fr/tupin/TPSAR/PROGS
Some useful functions are available in the file mvalab.py.
A useful function to read images in the Télécom-Paristech format is imz2mat.
The function to display a SAR image with an adapted thresholding of the values is visusar. The first argument, containing the image, is compulsory; the second one is optional and corresponds to a threshold for the display. The value $\alpha$ defines the threshold as follows:
${\rm thres} \:\:=\:\: {\rm vmean} \:\: + \:\: \alpha \: {\rm std}$
vmean is the mean value and std the standard deviation, both computed over the whole image. Every pixel whose value is above this threshold is displayed at the maximal value.
When $\alpha$ is 0, the image is displayed with the full dynamic range. A usual value to display SAR images is 3.
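The thresholding rule above can be sketched as a small helper. This is a hypothetical re-implementation of the described behaviour, not the actual mvalab code:

```python
import numpy as np

def visusar_threshold(img, alpha=3.0):
    """Hypothetical re-implementation of the display rule described above
    (the real routine is mvalab.visusar): pixels above mean + alpha * std
    are clipped to that threshold before display."""
    if alpha == 0:
        return img  # alpha = 0 means full dynamic range, no clipping
    thres = np.mean(img) + alpha * np.std(img)
    return np.minimum(img, thres)
```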
You have at your disposal a set of images from different sensors, with different characteristics, over the same area of Flevoland in the Netherlands (for each sensor and acquisition mode, a homogeneous sea area has been selected with the mer extension, and a farmland area with the centre extension):
Using the SLC data, check that the sea area follows the distributions given during the course for fully developed speckle by computing the histograms (phase, real part, imaginary part, intensity and amplitude). Compute the values of the first moments and the coefficient of variation and comment on their values.
In the work below, you can compare the histograms to the theoretical distributions. The fit is reasonable, modulo some clustering or "bunching up" of values, particularly in the real and imaginary parts. The coefficient of variation is close to the 0.523 figure from the Goodman model.
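The 0.523 figure can be checked directly: for fully developed speckle the amplitude is Rayleigh-distributed, whose coefficient of variation is $\sqrt{4/\pi - 1} \approx 0.5227$. A quick sketch (the simulation parameters are my own):

```python
import math
import numpy as np

# Closed form: for a Rayleigh amplitude A, E[A] = sigma*sqrt(pi/2) and
# Var[A] = (2 - pi/2)*sigma**2, hence a coefficient of variation of
gamma_A = math.sqrt(4.0 / math.pi - 1.0)
print(gamma_A)  # ~0.5227

# Monte-Carlo cross-check on simulated single-look speckle
rng = np.random.default_rng(0)
z = rng.normal(size=10**6) + 1j * rng.normal(size=10**6)
a = np.abs(z)
print(np.std(a) / np.mean(a))  # should be close to gamma_A
```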
import math
import numpy as np
import numpy             # the full module name is also used in some cells below
import numpy as npy      # legacy alias still used in one cell
import matplotlib.pyplot as plt
from scipy import signal
from scipy.stats import rayleigh, norm
import mvalab
tagnotebook=1
mvalab.notebook(tagnotebook)
pageweb="https://perso.telecom-paristech.fr/tupin/TPSAR/DATA/images/"
image='SentinelSLC_flevoland_mer.cxs'
im_slc_senti_mer_liste=mvalab.imz2mat(pageweb+image);
im_slc_senti_mer = im_slc_senti_mer_liste[0]
ncol=im_slc_senti_mer_liste[1]
nlig=im_slc_senti_mer_liste[2]
mvalab.visusar(im_slc_senti_mer)
#%%
amp_slc_senti_mer=numpy.abs(im_slc_senti_mer);
int_slc_senti_mer=numpy.multiply(amp_slc_senti_mer,amp_slc_senti_mer);
ph_slc_senti_mer=numpy.angle(im_slc_senti_mer);
real_slc_senti_mer=numpy.real(im_slc_senti_mer);
imag_slc_senti_mer=numpy.imag(im_slc_senti_mer);
# Intensity fit: single-look intensity is exponential when the real and
# imaginary channels are i.i.d. zero-mean Gaussian (fully developed speckle)
int_slc_senti_mer_vec = int_slc_senti_mer.reshape((ncol*nlig,))
seuil_int = np.mean(int_slc_senti_mer_vec) + 3*np.std(int_slc_senti_mer_vec)
int_slc_senti_mer_vec = int_slc_senti_mer_vec[int_slc_senti_mer_vec <= seuil_int]
exponential_fit_lambda = 1.0/np.mean(int_slc_senti_mer_vec)  # MLE of the rate
x_i = np.linspace(np.min(int_slc_senti_mer_vec), seuil_int, 1000)
exp_pdf = exponential_fit_lambda * np.exp(-exponential_fit_lambda*x_i)
# Amplitude fit: single-look amplitude follows a Rayleigh law
amp_slc_senti_mer_vec = amp_slc_senti_mer.reshape((ncol*nlig,))
seuil_amp = np.mean(amp_slc_senti_mer_vec) + 3*np.std(amp_slc_senti_mer_vec)
amp_slc_senti_mer_vec = amp_slc_senti_mer_vec[amp_slc_senti_mer_vec <= seuil_amp]
rloc, rscale = rayleigh.fit(amp_slc_senti_mer_vec)
xmin_a = np.min(amp_slc_senti_mer_vec)
xmax_a = np.max(amp_slc_senti_mer_vec)
x_a = np.linspace(xmin_a, xmax_a, 100)
ray_pdf = rayleigh.pdf(x_a, loc=rloc, scale=rscale)
# Phase fit (uniform 1/(2pi), uninformed by the data)
phase = ph_slc_senti_mer.reshape((ncol*nlig,))
x_unif = np.linspace(-1.0*math.pi, math.pi, 100)
unif_pdf = (1.0/(2*math.pi))*np.ones(x_unif.shape)
# Real fit
real = real_slc_senti_mer.reshape((ncol*nlig,))
mean_ = np.mean(real)
std_ = np.std(real)
print("REAL MEAN:", mean_)
print("REAL STD:", std_)
print("REAL STD/MEAN:", std_/mean_)
x_real = np.linspace(np.min(real), np.max(real), 100)
real_pdf = norm.pdf(x_real, loc=mean_, scale=std_)
# Imag fit
imag = imag_slc_senti_mer.reshape((ncol*nlig,))
std_ = np.std(imag)
mean_ = np.mean(imag)
print("IMAG MEAN:", mean_)
print("IMAG STD:", std_)
print("IMAG STD/MEAN:", std_/mean_)
x_imag = np.linspace(np.min(imag), np.max(imag), 100)
imag_pdf = norm.pdf(x_imag, loc=mean_, scale=std_)
plt.hist(int_slc_senti_mer_vec, bins='auto', density=True, range=[0, seuil_int])
plt.plot(x_i, exp_pdf)
plt.title("Intensity histogram (excl. outliers) vs. MLE exponential fit")
plt.show()
plt.hist(amp_slc_senti_mer_vec, bins='auto', density=True, range=[0., seuil_amp])
plt.plot(x_a, ray_pdf)
plt.title("Amplitude histogram (excl. outliers) vs. Rayleigh fit")
plt.show()
plt.hist(phase, bins='auto', density=True)
plt.plot(x_unif, unif_pdf)
plt.title("Observed phase vs. uniform pdf")
plt.show()
plt.hist(imag, bins='auto', density=True)
plt.plot(x_imag, imag_pdf)
plt.title("Observed imaginary part vs. Gaussian fit (incl. outliers)")
plt.show()
plt.hist(real, bins='auto', density=True)
plt.plot(x_real, real_pdf)
plt.title("Observed real part vs. Gaussian fit (incl. outliers)")
plt.show()
m_A=numpy.mean(amp_slc_senti_mer)
sigma_A=numpy.std(amp_slc_senti_mer)
coeff_var_A=sigma_A/m_A
print("MEAN (amplitude):", m_A)
print("STD (amplitude):", sigma_A)
print("COEFF_VAR (computed from amplitude):", coeff_var_A)
A common way to reduce speckle is to multi-look the data.
Use the value of the coefficient of variation to find the number of looks of the Sentinel GRD and ERS data. The formula is: $L = \frac{1}{\gamma_I^2}$, where $\gamma_I$ is the coefficient of variation of the intensity (equivalently, $L \approx \frac{0.523^2}{\gamma_A^2}$ from the amplitude).
Comment on the number of looks you find for the GRD and ERS data.
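As a sanity check of the estimator before applying it to real data (the simulation parameters are my own), an $L$-look intensity over a homogeneous area follows a Gamma law of shape $L$, and $1/\gamma_I^2$ should recover $L$:

```python
import numpy as np

# An L-look intensity over a homogeneous area follows a Gamma(L, mu/L) law;
# the ENL estimator 1/gamma_I**2 should recover L.
rng = np.random.default_rng(0)
for L_true in (1, 4, 9):
    I = rng.gamma(shape=L_true, scale=1.0 / L_true, size=10**6)
    gamma_I = np.std(I) / np.mean(I)
    print(L_true, 1.0 / gamma_I**2)  # recovered ENL, close to L_true
```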
image='SentinelGRD_flevoland_mer.imw'
im_senti_GRD_mer=mvalab.imz2mat(pageweb+image)
#mvalab.visusar(numpy.abs(im_senti_GRD_mer[0]))
im_grd_senti_mer = im_senti_GRD_mer[0].reshape((im_senti_GRD_mer[1]*im_senti_GRD_mer[2],))
amp_grd_senti_mer = numpy.abs(im_grd_senti_mer)
int_grd_senti_mer = amp_grd_senti_mer**2
coeff_I = np.std(int_grd_senti_mer)/np.mean(int_grd_senti_mer)
coeff_A = np.std(amp_grd_senti_mer)/np.mean(amp_grd_senti_mer)
print("ENL of Sentinel GRD (computed from intensity): ", 1.0/np.square(coeff_I))
print("ENL of Sentinel GRD (computed from amplitude): ", 0.523**2/np.square(coeff_A))
image='ERS_flevoland_mer.imw'
im_ers_mer=mvalab.imz2mat(pageweb+image)
#mvalab.visusar(numpy.abs(im_ers_mer[0]))
im_ers_mer_vec = im_ers_mer[0].reshape((im_ers_mer[1]*im_ers_mer[2],))
amp_ers_mer = numpy.abs(im_ers_mer_vec)
int_ers_mer = amp_ers_mer**2
coeff_I = np.std(int_ers_mer)/np.mean(int_ers_mer)
coeff_A = np.std(amp_ers_mer)/np.mean(amp_ers_mer)
print("ENL of ERS data (computed from intensity): ", 1.0/np.square(coeff_I))
print("ENL of ERS data (computed from amplitude): ", 0.523**2/np.square(coeff_A))
Compute a multi-look down-sampled image of the ALOS SLC data using a factor of 4 in the vertical direction and 1 in the horizontal (this gives almost square pixels).
Do the same with the Sentinel SLC image using a horizontal factor of 4 and a vertical factor of 1.
Comment on the effect of multi-looking.
# WARNING: to read a cxf image you cannot use the URL;
# you have to copy the image locally onto your computer
import os
image=os.getcwd()+'/images/Alos_flevoland_centre.cxf'
print(image)
im_alos_centre=mvalab.imz2mat(image)
im = np.abs(im_alos_centre[0])  # imz2mat returns a list; the image is element 0
mvalab.visusar(im)
from scipy.ndimage import uniform_filter
#average
im_vert = uniform_filter(im, size=(4,1))
im_horiz = uniform_filter(im, size=(1,4))
#downsample
im_vert = im_vert[::4,:]
im_horiz = im_horiz[:,::4]
mvalab.visusar(im_horiz)
mvalab.visusar(im_vert)
plt.figure()
#%%
masque_vert=numpy.ones((4,1))/4
alos_ml_int=signal.convolve2d(numpy.abs(im_alos_centre[0])**2, masque_vert, mode='same')
alos_ml_ssech_int=alos_ml_int[::4,:]
mvalab.visusar(numpy.sqrt(alos_ml_ssech_int))
plt.suptitle(u'ALOS: multi-look in intensity')
#% multi-look in amplitude
plt.figure()
alos_ml_amp=signal.convolve2d(numpy.abs(im_alos_centre[0]), masque_vert, mode='same')
alos_ml_ssech_amp=alos_ml_amp[::4,:]  # note: slicing syntax differs from MATLAB
mvalab.visusar(alos_ml_ssech_amp)
plt.suptitle(u'ALOS: multi-look in amplitude')
#%%
image='SentinelGRD_flevoland_centre.imw'
plt.figure()
im_sentigrd_centre=mvalab.imz2mat(pageweb+image)
mvalab.visusar(numpy.abs(im_sentigrd_centre[0]))  # Z mode: plt.figure and plt.show are not executed
plt.suptitle(u'Sentinel-1 GRD')
image='SentinelSLC_flevoland_centre.cxs'
im_sentislc_centre=mvalab.imz2mat(pageweb+image)
mvalab.visusar(numpy.abs(im_sentislc_centre[0]))  # Z mode: plt.figure and plt.show are not executed
plt.suptitle(u'Sentinel-1 SLC')
# TODO: multi-look and down-sample with a horizontal factor of 4 and a vertical factor of 1
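The remaining Sentinel SLC case can be sketched with a small helper following the same intensity-averaging pattern as the ALOS cell above (the helper name and its defaults are my own):

```python
import numpy as np
from scipy import signal

def multilook_intensity(slc, fac_row=1, fac_col=4):
    """Average the intensity over a (fac_row x fac_col) window,
    then down-sample by the same factors (hypothetical helper)."""
    intensity = np.abs(slc) ** 2
    kernel = np.ones((fac_row, fac_col)) / (fac_row * fac_col)
    ml = signal.convolve2d(intensity, kernel, mode='same')
    return ml[::fac_row, ::fac_col]

# In the notebook, applied to the Sentinel SLC image loaded above:
# senti_ml = multilook_intensity(im_sentislc_centre[0], fac_row=1, fac_col=4)
# mvalab.visusar(np.sqrt(senti_ml))
```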
This coefficient (the ratio of the standard deviation to the mean) is an indication of the local homogeneity of the scene.
Using 2D convolution to speed up the processing, compute the images of the coefficient of variation. Comment on the results (which structures of the image are highlighted?) and on the influence of the window size.
image='SentinelSLC_flevoland_centre.cxs'
im_slc_senti_centre=mvalab.imz2mat(pageweb+image)
ima_int=numpy.abs(im_slc_senti_centre[0])**2  # real-valued intensity
size_window=11
masque_loc=numpy.ones((size_window,size_window))/(size_window*size_window)
ima_intcarre=numpy.multiply(ima_int,ima_int)
ima_int_mean=signal.convolve2d(ima_int, masque_loc, mode='same')
plt.figure(1)
mvalab.visusar(numpy.sqrt(ima_int))
plt.figure(2)
mvalab.visusar(numpy.sqrt(ima_int_mean))
ima_int_mean_carre=signal.convolve2d(ima_intcarre, masque_loc, mode='same')
# guard against small negative values caused by floating-point rounding
ima_variance=numpy.maximum(ima_int_mean_carre-numpy.multiply(ima_int_mean,ima_int_mean), 0)
ima_coeff_var=numpy.divide(numpy.sqrt(ima_variance),ima_int_mean)
plt.figure(3)
mvalab.visusar(ima_coeff_var)
This local coefficient is also used in a very famous filter for SAR images: the Lee filter. The principle of the filter is to combine the pixel value $I_s$ (intensity value of pixel $s$) with the local mean $\hat{\mu}_{s}$, depending on the local coefficient of variation $\hat{\gamma}_s$, with the following formula: $ \hat{I}_s= \hat{\mu}_{s}+k_s (I_s-\hat{\mu}_{s}) $
with $ k_s=1- \frac{\gamma_S}{\hat{\gamma}_s} $,
where $\gamma_S$ is the theoretical value of the coefficient of variation for pure speckle ($\gamma_S=\frac{1}{\sqrt{L}}$ for an $L$-look intensity image).
Using the previous map, compute the resulting image with Lee filter. Comment the result and compare with a local mean.
Warning : $k$ should be in $[0,1]$.
L = 1  # the Sentinel SLC intensity image being filtered is single-look
gamma_s = 1 / np.sqrt(L)  # theoretical coefficient of variation of pure speckle
k_s = np.clip(1 - gamma_s / ima_coeff_var, 0.0, 1.0)  # enforce k in [0,1], per the warning above
print(np.min(k_s))
print(np.max(k_s))
I_s_hat = ima_int_mean + k_s * (ima_int - ima_int_mean)
plt.figure();
mvalab.visusar(ima_int)
mvalab.visusar(ima_coeff_var)
mvalab.visusar(I_s_hat)
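To compare with a plain local mean, as the question asks, here is a self-contained toy experiment (the scene, window size and seed are my own choices). The Lee output keeps the noisy value near strong structures ($k$ close to 1) and falls back to the local mean in homogeneous areas ($k$ close to 0):

```python
import numpy as np
from scipy import signal

# Toy scene: a step edge corrupted by single-look speckle (gamma_S = 1)
rng = np.random.default_rng(0)
truth = np.ones((64, 64))
truth[:, 32:] = 4.0
I = truth * rng.exponential(scale=1.0, size=truth.shape)

# Local statistics over a w x w window, computed by 2D convolution
w = 7
mask = np.ones((w, w)) / w**2
mu = signal.convolve2d(I, mask, mode='same')
m2 = signal.convolve2d(I**2, mask, mode='same')
gamma_hat = np.sqrt(np.maximum(m2 - mu**2, 0)) / mu

# Lee filter with the formula from the text, k clipped to [0, 1]
gamma_S = 1.0                       # 1/sqrt(L) with L = 1 look
k = np.clip(1 - gamma_S / gamma_hat, 0, 1)
lee = mu + k * (I - mu)

print("MSE local mean:", np.mean((mu - truth) ** 2))
print("MSE Lee filter:", np.mean((lee - truth) ** 2))
```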